A systematic review on overfitting control in shallow and deep neural networks
Abstract
Intelligent Transportation Systems (ITS) are closely tied to data science techniques. Among these, this paper focuses on neural network learning models. Some of the considered models are shallow: they take user-defined features and learn the relationships among them, while deep models extract the necessary features by themselves beforehand. Both paradigms are utilized in recent intelligent transportation systems to support decision-making operations such as frequent pattern mining, regression, clustering, and classification. When learners cannot generalize their results and merely memorize the training samples, they fail to meet these necessities. In such cases, the testing error is larger than the training error, a phenomenon addressed as overfitting in the literature. Because this issue decreases the reliability of learning systems, it harms ITS applications if an over-fitted machine is used for tasks such as traffic prediction, signal control, safety and emergency response, travel mode detection, driving evaluation, etc. Besides, deep models involve a great number of hyper-parameters, which demand more attention. To solve this problem, regularization methods can be followed. The aim of this review is to survey the approaches presented to regularize learning models and to categorize the related studies. Then, we give a case study that uses a regularized version of a convolutional neural network (CNN).
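As a minimal sketch of the train-test gap and L2 regularization described in the abstract (the dataset, polynomial degree, and penalty strength below are hypothetical choices for illustration, not from the review itself):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D regression task: y = sin(3x) + noise.
x_train = rng.uniform(-1, 1, 15)
y_train = np.sin(3 * x_train) + 0.3 * rng.standard_normal(15)
x_test = rng.uniform(-1, 1, 200)
y_test = np.sin(3 * x_test)

def features(x, degree=14):
    # Polynomial feature matrix [1, x, x^2, ..., x^degree].
    return np.vander(x, degree + 1, increasing=True)

def fit(x, y, l2=0.0):
    # L2-regularized least squares via an augmented system:
    # minimize ||Xw - y||^2 + l2 * ||w||^2.
    X = features(x)
    k = X.shape[1]
    X_aug = np.vstack([X, np.sqrt(l2) * np.eye(k)])
    y_aug = np.concatenate([y, np.zeros(k)])
    return np.linalg.lstsq(X_aug, y_aug, rcond=None)[0]

def mse(w, x, y):
    return float(np.mean((features(x) @ w - y) ** 2))

w_free = fit(x_train, y_train, l2=0.0)   # 15 coefficients, 15 points: memorizes the noise
w_reg = fit(x_train, y_train, l2=1e-3)   # shrunk weights generalize better

gap_free = mse(w_free, x_test, y_test) - mse(w_free, x_train, y_train)
gap_reg = mse(w_reg, x_test, y_test) - mse(w_reg, x_train, y_train)
print(f"test-train gap, unregularized:   {gap_free:.3f}")
print(f"test-train gap, L2-regularized:  {gap_reg:.3f}")
```

The unregularized model fits the training samples almost exactly, so its test error far exceeds its training error, the overfitting signature the abstract describes; the penalized fit keeps the two errors close together.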
Similar resources
Porosity classification from thin sections using image analysis and neural networks including shallow and deep learning in Jahrum formation
The porosity within a reservoir rock is a basic parameter for the reservoir characterization. The present paper introduces two intelligent models for identification of the porosity types using image analysis. For this aim, firstly, thirteen geometrical parameters of pores of each image were extracted using the image analysis techniques. The extracted features and their corresponding pore types ...
Stacked Training for Overfitting Avoidance in Deep Networks
When training deep networks and other complex networks of predictors, the risk of overfitting is typically of large concern. We examine the use of stacking, a method for training multiple simultaneous predictors in order to simulate the overfitting in early layers of a network, and show how to utilize this approach for both forward training and backpropagation learning in deep networks. We then...
Reducing Overfitting in Deep Networks by Decorrelating Representations
One major challenge in training Deep Neural Networks is preventing overfitting. Many techniques such as data augmentation and novel regularizers such as Dropout have been proposed to prevent overfitting without requiring a massive amount of training data. In this work, we propose a new regularizer called DeCov which leads to significantly reduced overfitting (as indicated by the difference betw...
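A decorrelation penalty in the spirit of the DeCov regularizer described above can be sketched as follows (a hedged NumPy illustration under the assumption that the penalty sums squared off-diagonal entries of the batch covariance of hidden activations; the toy activations are hypothetical):

```python
import numpy as np

def decov_penalty(h):
    """Decorrelation penalty on hidden activations h of shape (batch, features):
    0.5 * (||C||_F^2 - ||diag(C)||_2^2), where C is the batch covariance,
    i.e. the sum of squared off-diagonal covariances."""
    hc = h - h.mean(axis=0, keepdims=True)   # center over the batch
    C = hc.T @ hc / h.shape[0]               # covariance matrix (features x features)
    return 0.5 * (np.sum(C ** 2) - np.sum(np.diag(C) ** 2))

# Perfectly correlated features (every unit is a copy) incur a large penalty...
h_corr = np.outer(np.arange(8.0), np.ones(4))
# ...while roughly independent features incur a small one.
rng = np.random.default_rng(1)
h_ind = rng.standard_normal((8, 4))
print(decov_penalty(h_corr), decov_penalty(h_ind))
```

Adding this term to the training loss pushes hidden units toward non-redundant representations, which is the mechanism the abstract credits for the reduced train-validation gap.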
On the complexity of shallow and deep neural network classifiers
Recently, deep networks were proved to be more effective than shallow architectures to face complex real–world applications. However, theoretical results supporting this claim are still few and incomplete. In this paper, we propose a new topological measure to study how the depth of feedforward networks impacts on their ability of implementing high complexity functions. Upper and lower bounds o...
Dropout: a simple way to prevent neural networks from overfitting
Deep neural nets with a large number of parameters are very powerful machine learning systems. However, overfitting is a serious problem in such networks. Large networks are also slow to use, making it difficult to deal with overfitting by combining the predictions of many different large neural nets at test time. Dropout is a technique for addressing this problem. The key idea is to randomly d...
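The key idea of randomly dropping units can be sketched as inverted dropout (a minimal NumPy version for illustration; the drop probability and array sizes are arbitrary choices):

```python
import numpy as np

def dropout(h, p=0.5, train=True, rng=None):
    """Inverted dropout: during training, zero each unit with probability p and
    scale the survivors by 1/(1-p) so the expected activation is unchanged.
    At test time the layer is the identity, so no rescaling is needed."""
    if not train:
        return h
    rng = rng if rng is not None else np.random.default_rng()
    mask = rng.random(h.shape) >= p   # True = unit kept
    return h * mask / (1.0 - p)

rng = np.random.default_rng(0)
h = np.ones((10000, 8))
out = dropout(h, p=0.5, rng=rng)
print(out.mean())  # close to 1.0: expected activation is preserved
```

Because the mask is resampled on every forward pass, each update effectively trains a different "thinned" sub-network, which is the ensemble-averaging effect the abstract describes.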
Journal
Journal title: Artificial Intelligence Review
Year: 2021
ISSN: 0269-2821, 1573-7462
DOI: https://doi.org/10.1007/s10462-021-09975-1